
    Inference in Hybrid Bayesian Networks with Nonlinear Deterministic Conditionals

    This is the peer-reviewed version of the following article: Cobb, B. R. and Shenoy, P. P. (2017), Inference in Hybrid Bayesian Networks with Nonlinear Deterministic Conditionals. Int. J. Intell. Syst., 32: 1217–1246. doi:10.1002/int.21897, which has been published in final form at https://doi.org/10.1002/int.21897. This article may be used for non-commercial purposes in accordance with the Wiley Terms and Conditions for Self-Archiving.

    To enable inference in hybrid Bayesian networks (BNs) containing nonlinear deterministic conditional distributions, Cobb and Shenoy (2005) proposed approximating nonlinear deterministic functions by piecewise linear (PL) ones. In this paper, we describe a method for finding PL approximations of nonlinear functions based on a penalized mean squared error (MSE) heuristic, which consists of minimizing a penalized MSE function subject to two principles, domain and symmetry. We illustrate our method for several commonly used one-dimensional and two-dimensional nonlinear deterministic functions. Finally, we solve two small examples of hybrid BNs containing nonlinear deterministic conditionals that arise in practice.
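    The abstract does not reproduce the penalized-MSE heuristic itself, so as a rough illustration of the underlying idea only, the sketch below fits a piecewise linear approximation to f(x) = x² by plain least squares over fixed, hand-chosen breakpoints (the function name `pl_least_squares` and the knot placement are assumptions for illustration, not the paper's method):

    ```python
    import numpy as np

    def pl_least_squares(f, knots, n_samples=400):
        """Fit a piecewise-linear approximation to f on [knots[0], knots[-1]]
        by least squares over dense sample points. Returns fitted values at
        the knots; np.interp then evaluates the PL function anywhere."""
        xs = np.linspace(knots[0], knots[-1], n_samples)
        # Hat-function basis: column j is the PL function that is 1 at knot j
        # and 0 at every other knot, evaluated at the sample points.
        basis = np.column_stack([
            np.interp(xs, knots, np.eye(len(knots))[j]) for j in range(len(knots))
        ])
        vals, *_ = np.linalg.lstsq(basis, f(xs), rcond=None)
        return vals

    knots = np.linspace(-2.0, 2.0, 5)          # breakpoints chosen by hand here
    vals = pl_least_squares(lambda x: x**2, knots)
    approx = lambda x: np.interp(x, knots, vals)
    ```

    With symmetric knots and an even target function, the least-squares fit comes out symmetric as well, which loosely echoes the paper's symmetry principle for PL approximations.
    
    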

    On Transforming Belief Function Models to Probability Models

    In response to reviewer comments on this paper, we have written a shorter and more focused paper: "On the Plausibility Transformation Method for Translating Belief Function Models to Probability Models," University of Kansas School of Business Working Paper No. 308, June 2004, Lawrence, KS.

    In this paper, we explore methods for transforming a belief function model to an equivalent probability model. We propose and define the properties of a method called the plausibility transformation method, and we compare it with the pignistic transformation method. These two methods yield qualitatively different probability models. We argue that the plausibility transformation method is the correct one for maintaining belief function semantics.
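    The plausibility transformation has a standard closed form: each singleton's plausibility Pl({x}) = Σ over all focal elements A containing x of m(A) is normalized into a probability. A minimal sketch (the helper name and example masses are illustrative):

    ```python
    from itertools import chain

    def plausibility_transform(masses):
        """masses: dict mapping frozenset focal elements -> mass (sums to 1).
        Returns the probability distribution obtained by normalizing the
        singleton plausibilities Pl({x}) = sum of m(A) over all A containing x."""
        frame = set(chain.from_iterable(masses))
        pl = {x: sum(m for A, m in masses.items() if x in A) for x in frame}
        total = sum(pl.values())
        return {x: p / total for x, p in pl.items()}

    # Example: m({a}) = 0.4, m({a, b}) = 0.6
    # Pl(a) = 0.4 + 0.6 = 1.0, Pl(b) = 0.6 -> P(a) = 0.625, P(b) = 0.375
    m = {frozenset({'a'}): 0.4, frozenset({'a', 'b'}): 0.6}
    p = plausibility_transform(m)
    ```

    Note that the pignistic transformation would instead split m({a, b}) evenly, giving P(a) = 0.7 and P(b) = 0.3, which illustrates the qualitative difference between the two methods.
    
    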

    Nonlinear Deterministic Relationships in Bayesian Networks

    In a Bayesian network with continuous variables, if a variable is a conditionally deterministic function of its continuous parents, the joint density function does not exist. Conditional linear Gaussian distributions can handle such cases when the deterministic function is linear and the continuous variables have a multivariate normal distribution. In this paper, we develop the operations required for performing inference with nonlinear conditionally deterministic variables. We perform inference in networks with nonlinear deterministic variables and non-Gaussian continuous variables by using piecewise linear approximations to nonlinear functions and by modeling probability distributions with mixtures of truncated exponentials (MTE) potentials.

    Operations for inference in continuous Bayesian networks with linear deterministic variables

    An important class of continuous Bayesian networks comprises those with linear conditionally deterministic variables (variables that are linear deterministic functions of their parents). In this case, the joint density function for the variables in the network does not exist. Conditional linear Gaussian (CLG) distributions can handle such cases when all variables are normally distributed. In this paper, we develop the operations required for performing inference with linear conditionally deterministic variables in continuous Bayesian networks using relationships derived from joint cumulative distribution functions (CDFs). These methods allow inference in networks with linear deterministic variables and non-Gaussian distributions.
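    The basic CDF identity underlying this approach, for a single linear deterministic variable Y = aX + b with continuous X, is F_Y(y) = F_X((y − b)/a) when a > 0 (and 1 − F_X((y − b)/a) when a < 0). A minimal sketch with a non-Gaussian parent (the helper name and the exponential example are assumptions for illustration, not the paper's full set of operations):

    ```python
    import math

    def cdf_of_linear(F_X, a, b):
        """Return the CDF of Y = a*X + b given the CDF of a continuous X (a != 0)."""
        if a > 0:
            return lambda y: F_X((y - b) / a)
        return lambda y: 1.0 - F_X((y - b) / a)

    # Non-Gaussian parent: X ~ Exponential(rate=1), so F_X(x) = 1 - exp(-x) for x > 0.
    F_X = lambda x: 1.0 - math.exp(-x) if x > 0 else 0.0
    F_Y = cdf_of_linear(F_X, a=2.0, b=1.0)   # Y = 2X + 1
    # F_Y(3) = F_X(1) = 1 - e^(-1)
    ```
    
    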

    Hybrid Influence Diagrams Using Mixtures of Truncated Exponentials

    This is a short nine-page version of a longer unpublished working paper titled "Decision Making with Hybrid Influence Diagrams Using Mixtures of Truncated Exponentials," School of Business Working Paper No. 304, May 2004, Lawrence, KS.

    Mixtures of truncated exponentials (MTE) potentials are an alternative to discretization for representing continuous chance variables in influence diagrams. MTE potentials can also be used to approximate utility functions. This paper introduces MTE influence diagrams, which can represent decision problems without restrictions on the relationships between continuous and discrete chance variables, without limitations on the distributions of continuous chance variables, and without limitations on the nature of the utility functions. In MTE influence diagrams, all probability distributions and the joint utility function (or its multiplicative factors) are represented by MTE potentials, and decision nodes are assumed to have discrete state spaces. MTE influence diagrams are solved by variable elimination using a fusion algorithm.

    Partially supported by a graduate research assistantship to Barry R. Cobb from the Ronald G. Harper Professorship, and by a contract from Sparta, Inc., to Prakash P. Shenoy.

    Inference in Hybrid Bayesian Networks with Deterministic Variables

    An important class of hybrid Bayesian networks comprises those with conditionally deterministic variables (variables that are deterministic functions of their parents). In this case, if some of the parents are continuous, the joint density function does not exist. Conditional linear Gaussian (CLG) distributions can handle such cases when the deterministic function is linear and the continuous variables are normally distributed. In this paper, we develop the operations required for performing inference with conditionally deterministic variables using relationships derived from joint cumulative distribution functions (CDFs). These methods allow inference in networks with deterministic variables where the continuous variables are non-Gaussian.

    Hybrid Bayesian Networks with Linear Deterministic Variables

    When a hybrid Bayesian network has conditionally deterministic variables with continuous parents, the joint density function for the continuous variables does not exist. Conditional linear Gaussian distributions can handle such cases when the continuous variables have a multivariate normal distribution and the discrete variables do not have continuous parents. In this paper, we develop the operations required for performing inference with conditionally deterministic variables in hybrid Bayesian networks. These methods allow inference in networks with deterministic variables where the continuous variables may be non-Gaussian and their density functions can be approximated by mixtures of truncated exponentials. There are no constraints on the placement of continuous and discrete nodes in the network.

    Approximating probability density functions in hybrid Bayesian networks with mixtures of truncated exponentials

    Mixtures of truncated exponentials (MTE) potentials are an alternative to discretization and Monte Carlo methods for solving hybrid Bayesian networks. Any probability density function (PDF) can be approximated by an MTE potential, which can always be marginalized in closed form. This allows propagation to be done exactly using the Shenoy-Shafer architecture for computing marginals, with no restrictions on the construction of a join tree. This paper presents MTE potentials that approximate standard PDFs, along with applications of these potentials for solving inference problems in hybrid Bayesian networks. These approximations extend the types of inference problems that can be modeled with Bayesian networks, as demonstrated using three examples.
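    The property that makes MTE potentials tractable is that exponentials integrate in closed form, so marginalization never requires numerical quadrature. A minimal sketch of that closed-form integration (the toy potential and its coefficient are chosen here for illustration, not taken from the paper's published approximations):

    ```python
    import math

    def mte_integral(a0, terms, lo, hi):
        """Integrate an MTE potential f(x) = a0 + sum_i a_i * exp(b_i * x)
        over [lo, hi] in closed form; exact integration of exponentials is
        what allows MTE potentials to be marginalized without quadrature."""
        total = a0 * (hi - lo)
        for a, b in terms:
            total += a * (math.exp(b * hi) - math.exp(b * lo)) / b
        return total

    # Toy MTE potential on [0, 1]: f(x) = c * exp(-x), with c chosen so the
    # potential integrates to 1 (illustrative, not a published fit).
    c = 1.0 / (1.0 - math.exp(-1.0))
    norm = mte_integral(0.0, [(c, -1.0)], 0.0, 1.0)
    ```
    
    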

    Parallelizing Heavyweight Debugging Tools with MPIecho

    Idioms created for debugging execution on single processors and multicore systems have been successfully scaled to thousands of processors, but there is little hope that this class of techniques can continue to be scaled out to tens of millions of cores. To allow the development of more scalable debugging idioms, we introduce MPIecho, a novel runtime platform that enables cloning of MPI ranks. Given identical execution on each clone, we then show how heavyweight debugging approaches can be parallelized, reducing their overhead to a fraction of the serialized case. We also show how this platform can be useful in isolating the source of hardware-based nondeterministic behavior, and we provide a case study based on a recent processor bug at LLNL. While total overhead will depend on the individual tool, we show that the platform itself contributes little: 512x tool parallelization incurs at worst 2x overhead across the NAS Parallel Benchmarks, and hardware fault isolation contributes at worst an additional 44% overhead. Finally, we show how MPIecho can lead to a near-linear reduction in overhead when combined with Maid, a heavyweight memory tracking tool provided with Intel's Pin platform. We demonstrate overhead reductions from 1,466% to 53% and from 740% to 14% for cg.D.64 and lu.D.64, respectively, using only an additional 64 cores.
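    As a back-of-envelope reading of the headline numbers (this simple model is an assumption for illustration, not the paper's measurement methodology): if a serialized tool overhead is split evenly across k clone ranks, the residual overhead is roughly overhead/k plus a fixed platform cost.

    ```python
    def parallelized_overhead(tool_overhead, clones, platform_overhead=0.0):
        """Idealized overhead model for an MPIecho-style setup: a serialized
        tool overhead divided evenly across `clones` clone ranks, plus a
        fixed platform cost. Purely illustrative, not from the paper."""
        return platform_overhead + tool_overhead / clones

    # A 1466% serialized overhead split across 64 clones leaves roughly 23%
    # of tool overhead before platform costs (fractions, not percentages).
    residual = parallelized_overhead(14.66, 64)
    ```

    Under this idealized model, 14.66/64 ≈ 0.23, so the observed 53% for cg.D.64 is consistent with near-linear reduction plus platform and imbalance costs.
    
    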

    Rasip1-Mediated Rho GTPase Signaling Regulates Blood Vessel Tubulogenesis via Nonmuscle Myosin II

    Vascular tubulogenesis is essential to cardiovascular development. Within the initial vascular cords of endothelial cells (ECs), apical membranes are established and cleared of cell-cell junctions, thereby allowing continuous central lumens to open. Rasip1 is required for apical junction clearance, as well as for regulation of Rho GTPase activity. However, it remains unknown how the activities of different Rho GTPases are coordinated by Rasip1 to direct tubulogenesis.